
    Optimal Memoryless Encoding for Low Power Off-Chip Data Buses

    Off-chip buses account for a significant portion of the total system power consumed in embedded systems. Bus encoding schemes have been proposed to minimize power dissipation, but none has been demonstrated to be optimal with respect to any measure. In this paper, we give the first provably optimal and explicit (polynomial-time constructible) families of memoryless codes for minimizing bit transitions in off-chip buses. Our results imply that having access to a clock does not make a memoryless encoding scheme that minimizes bit transitions more powerful. Comment: Proceedings of the 2006 IEEE/ACM International Conference on Computer-Aided Design (San Jose, California, November 05-09, 2006), ICCAD '06, ACM, New York, NY, 369-37
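
    To make the cost measure concrete (an illustrative sketch only, not the optimal codes constructed in the paper), the snippet below counts bus-line toggles between consecutive words and compares a raw trace with a hypothetical memoryless codebook; the codebook values are made up for this example.

def bit_transitions(words, width=8):
    """Number of bus lines that toggle between consecutive words."""
    total = 0
    for prev, cur in zip(words, words[1:]):
        total += bin((prev ^ cur) & ((1 << width) - 1)).count("1")
    return total

def encode(word, codebook):
    """A memoryless encoder applies the same fixed map to every word,
    independently of previously transmitted values."""
    return codebook[word]

if __name__ == "__main__":
    data = [0b00001111, 0b11110000, 0b00001111, 0b11111111]
    # Hypothetical codebook that happens to reduce toggling on this trace.
    codebook = {0b00001111: 0b00000000, 0b11110000: 0b00000001,
                0b11111111: 0b00000011}
    raw = bit_transitions(data)
    encoded = bit_transitions([encode(w, codebook) for w in data])
    print(f"raw transitions: {raw}, encoded transitions: {encoded}")
    # raw transitions: 20, encoded transitions: 4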

    Anti-Pasch optimal packings with triples

    It is shown that for v ≠ 6, 7, 10, 11, 12, 13, there exists an optimal packing with triples on v points that contains no Pasch configurations. Furthermore, for all v ≡ 5 (mod 6), there exists a pairwise balanced design of order v, whose blocks are all triples apart from a single quintuple, and that has no Pasch configurations amongst its triples.
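
    For readers unfamiliar with the terminology: a Pasch configuration is a set of four triples on six points of the form {a,b,c}, {a,d,e}, {b,d,f}, {c,e,f}. The sketch below (illustrative only, not part of the paper's constructions) tests a small collection of triples for Pasch configurations.

from itertools import combinations

def is_pasch(quad):
    """Four triples form a Pasch configuration iff they cover exactly six
    points, meet pairwise in at most one point, and every point lies in
    exactly two of the four triples."""
    points = set().union(*quad)
    if len(points) != 6:
        return False
    if any(len(a & b) > 1 for a, b in combinations(quad, 2)):
        return False
    return all(sum(p in t for t in quad) == 2 for p in points)

def has_pasch(triples):
    """Exhaustive check (fine for small packings) for a Pasch configuration."""
    return any(is_pasch(q) for q in combinations(triples, 4))

if __name__ == "__main__":
    # {1,2,3}, {1,4,5}, {2,4,6}, {3,5,6} is the canonical Pasch configuration.
    print(has_pasch([{1, 2, 3}, {1, 4, 5}, {2, 4, 6}, {3, 5, 6}]))  # True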

    Set-Codes with Small Intersections and Small Discrepancies

    We are concerned with the problem of designing large families of subsets over a common labeled ground set that have small pairwise intersections and the property that the maximum discrepancy of the label values within each of the sets is less than or equal to one. Our results, based on transversal designs, factorizations of packings and Latin rectangles, show that by jointly constructing the sets and labeling scheme, one can achieve optimal family sizes for many parameter choices. Probabilistic arguments akin to those used for pseudorandom generators lead to significantly suboptimal results when compared to the proposed combinatorial methods. The design problem considered is motivated by applications in molecular data storage and theoretical computer science.
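
    As an informal illustration of the two properties being balanced (the function name and the ±1 labelling convention are assumptions, not taken from the paper), the sketch below checks that every pair of sets in a family intersects in at most one element and that each set's discrepancy, the absolute sum of its ±1 labels, is at most one.

from itertools import combinations

def valid_set_code(family, labels, max_intersection=1):
    """Check two properties of a candidate set-code family:
    (1) any two sets share at most `max_intersection` elements, and
    (2) each set has discrepancy at most one, i.e. |sum of its +/-1 labels| <= 1."""
    if any(len(a & b) > max_intersection for a, b in combinations(family, 2)):
        return False
    return all(abs(sum(labels[x] for x in s)) <= 1 for s in family)

if __name__ == "__main__":
    labels = {1: +1, 2: -1, 3: +1, 4: -1, 5: +1, 6: -1}
    family = [{1, 2, 3}, {4, 5}, {3, 6}]
    print(valid_set_code(family, labels))  # True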

    Entanglement-assisted quantum low-density parity-check codes

    This paper develops a general method for constructing entanglement-assisted quantum low-density parity-check (LDPC) codes, which is based on combinatorial design theory. Explicit constructions are given for entanglement-assisted quantum error-correcting codes (EAQECCs) with many desirable properties. These properties include the requirement of only one initial entanglement bit, high error correction performance, high rates, and low decoding complexity. The proposed method produces infinitely many new codes with a wide variety of parameters and entanglement requirements. Our framework encompasses various codes including the previously known entanglement-assisted quantum LDPC codes having the best error correction performance and many new codes with better block error rates in simulations over the depolarizing channel. We also determine important parameters of several well-known classes of quantum and classical LDPC codes for previously unsettled cases. Comment: 20 pages, 5 figures. Final version appearing in Physical Review
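
    As background on the design-theoretic ingredient (a generic textbook illustration, not the authors' entanglement-assisted construction), parity-check matrices for design-based LDPC codes are often taken to be point-block incidence matrices of combinatorial designs; the sketch below builds the incidence matrix of the Fano plane, the smallest Steiner triple system.

# Illustrative only: the point-block incidence matrix of a small design.
FANO_BLOCKS = [{1, 2, 3}, {1, 4, 5}, {1, 6, 7}, {2, 4, 6},
               {2, 5, 7}, {3, 4, 7}, {3, 5, 6}]

def incidence_matrix(points, blocks):
    """Rows indexed by points, columns by blocks; entry 1 iff the point
    lies on the block."""
    return [[1 if p in b else 0 for b in blocks] for p in points]

if __name__ == "__main__":
    H = incidence_matrix(range(1, 8), FANO_BLOCKS)
    for row in H:
        print(row)
    # Each point lies on 3 lines and each line has 3 points, so H is a
    # (3,3)-regular binary matrix that can serve as a parity-check matrix.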

    Solution to the Mean King's problem with mutually unbiased bases for arbitrary levels

    The Mean King's problem with mutually unbiased bases is reconsidered for arbitrary d-level systems. Hayashi, Horibe and Hashimoto [Phys. Rev. A 71, 052331 (2005)] related the problem to the existence of a maximal set of d-1 mutually orthogonal Latin squares, in their restricted setting that allows only measurements of projection-valued measures. Under that restriction, however, no solution can be found when, e.g., d=6 or d=10. In contrast to their result, we show that the King's problem always has a solution for arbitrary levels if we also allow positive operator-valued measures. In constructing the solution, we use orthogonal arrays in combinatorial design theory. Comment: REVTeX4, 4 page
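
    As a concrete illustration of the combinatorial objects involved (a standard construction for prime d, not the paper's POVM-based solution), the squares L_k[i][j] = (k*i + j) mod d for k = 1, ..., d-1 form a complete set of d-1 mutually orthogonal Latin squares; the sketch below builds and verifies them.

from itertools import combinations, product

def mols(d):
    """For prime d, L_k[i][j] = (k*i + j) mod d, k = 1..d-1, gives a
    complete set of d-1 mutually orthogonal Latin squares."""
    return [[[(k * i + j) % d for j in range(d)] for i in range(d)]
            for k in range(1, d)]

def orthogonal(a, b, d):
    """Two d x d squares are orthogonal iff superimposing them produces
    every ordered pair of symbols exactly once."""
    return len({(a[i][j], b[i][j]) for i, j in product(range(d), repeat=2)}) == d * d

if __name__ == "__main__":
    d = 5
    squares = mols(d)
    print(all(orthogonal(a, b, d) for a, b in combinations(squares, 2)))  # True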

    Impact evaluation of different cash-based intervention modalities on child and maternal nutritional status in Sindh Province, Pakistan, at 6 mo and at 1 y: A cluster randomised controlled trial

    BACKGROUND: Cash-based interventions (CBIs) offer an interesting opportunity to prevent increases in wasting in humanitarian aid settings. However, questions remain as to the impact of CBIs on nutritional status and, therefore, how to incorporate them into emergency programmes to maximise their success in terms of improved nutritional outcomes. This study evaluated the effects of three different CBI modalities on nutritional outcomes in children under 5 y of age at 6 mo and at 1 y.
    METHODS AND FINDINGS: We conducted a four-arm parallel longitudinal cluster randomised controlled trial in 114 villages in Dadu District, Pakistan. The study included poor and very poor households (n = 2,496) with one or more children aged 6–48 mo (n = 3,584) at baseline. All four arms had equal access to an Action Against Hunger–supported programme. The three intervention arms were as follows: standard cash (SC), a cash transfer of 1,500 Pakistani rupees (PKR) (approximately US$14; 1 PKR = US$0.009543); double cash (DC), a cash transfer of 3,000 PKR; or a fresh food voucher (FFV) of 1,500 PKR; the cash or voucher amount was given every month over six consecutive months. The control group (CG) received no specific cash-related interventions. The median total household income for the study sample was 8,075 PKR (approximately US$77) at baseline. We hypothesized that, compared to the CG in each case, FFVs would be more effective than SC, and that DC would be more effective than SC, both at 6 mo and at 1 y, for reducing the risk of child wasting. Primary outcomes of interest were prevalence of being wasted (weight-for-height z-score [WHZ] < −2) and mean WHZ at 6 mo and at 1 y. The odds of a child being wasted were significantly lower in the DC arm after 6 mo (odds ratio [OR] = 0.52; 95% CI 0.29, 0.92; p = 0.02) compared to the CG. Mean WHZ significantly improved in both the FFV and DC arms at 6 mo (FFV: z-score = 0.16; 95% CI 0.05, 0.26; p = 0.004; DC: z-score = 0.11; 95% CI 0.00, 0.21; p = 0.05) compared to the CG. Significant differences on the primary outcome were seen only at 6 mo. All three intervention groups showed similar significantly lower odds of being stunted (height-for-age z-score [HAZ] < −2) at 6 mo (DC: OR = 0.39; 95% CI 0.24, 0.64; p < 0.001; FFV: OR = 0.41; 95% CI 0.25, 0.67; p < 0.001; SC: OR = 0.36; 95% CI 0.22, 0.59; p < 0.001) and at 1 y (DC: OR = 0.53; 95% CI 0.35, 0.82; p = 0.004; FFV: OR = 0.48; 95% CI 0.31, 0.73; p = 0.001; SC: OR = 0.54; 95% CI 0.36, 0.81; p = 0.003) compared to the CG. Significant improvements in height-for-age outcomes were also seen for severe stunting (HAZ < −3) and mean HAZ. An unintended outcome was observed in the FFV arm: a negative intervention effect on mean haemoglobin (Hb) status (−2.6 g/l; 95% CI −4.5, −0.8; p = 0.005). Limitations of this study included the inability to mask participants or data collectors to the different interventions, the potentially restrictive nature of the FFVs, not being able to measure a threshold effect for the two different cash amounts or compare the different quantities of food consumed, and data collection challenges given the difficult environment in which this study was set.
    CONCLUSIONS: In this setting, the amount of cash given was important. The larger cash transfer had the greatest effect on wasting, but only at 6 mo. Impacts at both 6 mo and at 1 y were seen for height-based growth variables regardless of the intervention modality, indicating a trend toward nutrition resilience. Purchasing restrictions applied to food-based voucher transfers could have unintended effects, and their use needs to be carefully planned to avoid this.
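
    For readers less familiar with the statistics reported above, the odds ratios compare the odds of an outcome (e.g., being wasted) in an intervention arm against the control group. The sketch below computes an odds ratio with a Wald confidence interval from a 2x2 table; the counts are hypothetical and are not taken from the trial.

import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald confidence interval (95% for z = 1.96) from a
    2x2 table: a = events in intervention, b = non-events in intervention,
    c = events in control, d = non-events in control."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

if __name__ == "__main__":
    # Hypothetical counts, not the study's data.
    print(odds_ratio_ci(30, 470, 55, 445))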

    Two-stage stochastic minimum s − t cut problems: Formulations, complexity and decomposition algorithms

    We introduce the two-stage stochastic minimum s − t cut problem. Based on a classical linear 0-1 programming model for the deterministic minimum s − t cut problem, we provide a mathematical programming formulation for the proposed stochastic extension. We show that its constraint matrix loses the total unimodularity property in general, but preserves it if the considered graph is a tree. This is not surprising, as we prove that the considered problem is NP-hard in general, but admits a linear time solution algorithm when the graph is a tree. We exploit the special structure of the problem and propose a tailored Benders decomposition algorithm. We evaluate the computational efficiency of this algorithm by solving the Benders dual subproblems as max-flow problems. For many tested instances, the Benders decomposition exploiting the max-flow structure of the subproblems outperforms a standard Benders decomposition by two orders of magnitude.
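
    As a reminder of the deterministic building block that the Benders subproblems reduce to (an elementary max-flow/min-cut sketch, not the paper's two-stage formulation or its decomposition algorithm), the following computes the minimum s-t cut value with BFS augmenting paths.

from collections import defaultdict, deque

def min_st_cut(capacity, s, t):
    """Minimum s-t cut value via Edmonds-Karp max-flow.
    `capacity` maps directed (u, v) edges to nonnegative capacities."""
    residual = defaultdict(int)
    adj = defaultdict(set)
    for (u, v), c in capacity.items():
        residual[(u, v)] += c
        adj[u].add(v)
        adj[v].add(u)  # reverse edges let flow be pushed back

    def bfs():
        parent = {s: None}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in parent and residual[(u, v)] > 0:
                    parent[v] = u
                    if v == t:
                        return parent
                    queue.append(v)
        return None

    flow = 0
    while (parent := bfs()) is not None:
        # Find the bottleneck along the augmenting path and update residuals.
        path, v = [], t
        while v != s:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[e] for e in path)
        for u, v in path:
            residual[(u, v)] -= bottleneck
            residual[(v, u)] += bottleneck
        flow += bottleneck
    return flow  # equals the min cut value by max-flow/min-cut duality

if __name__ == "__main__":
    capacity = {("s", "a"): 3, ("s", "b"): 2, ("a", "t"): 2,
                ("b", "t"): 3, ("a", "b"): 1}
    print(min_st_cut(capacity, "s", "t"))  # 5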

    Constructing interaction test suites with greedy algorithms

    Combinatorial approaches to testing are used in several fields, and have recently gained momentum in the field of software testing through software interaction testing. One-test-at-a-time greedy algorithms are used to automatically construct such test suites. This paper discusses the basic reasons why greedy algorithms have been appropriate for this test generation problem in the past and then expands upon how greedy algorithms can be utilized to address test suite prioritization.
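
    To make the one-test-at-a-time idea concrete (a minimal greedy sketch for pairwise coverage under assumed parameter domains, not one of the specific algorithms analysed in the paper), each iteration below builds the single test that covers the most not-yet-covered parameter-value pairs.

from itertools import combinations, product

def greedy_pairwise_suite(domains):
    """One-test-at-a-time greedy construction of a pairwise-covering test
    suite. `domains` is a list of value lists, one per parameter."""
    uncovered = {((i, a), (j, b))
                 for i, j in combinations(range(len(domains)), 2)
                 for a in domains[i] for b in domains[j]}
    suite = []
    while uncovered:
        # Enumerate all candidate tests and keep the one that covers the
        # most still-uncovered pairs (exhaustive, so only for small inputs).
        best, best_gain = None, -1
        for test in product(*domains):
            pairs = {((i, test[i]), (j, test[j]))
                     for i, j in combinations(range(len(test)), 2)}
            gain = len(pairs & uncovered)
            if gain > best_gain:
                best, best_gain = test, gain
        suite.append(best)
        uncovered -= {((i, best[i]), (j, best[j]))
                      for i, j in combinations(range(len(best)), 2)}
    return suite

if __name__ == "__main__":
    # Three binary parameters: pairwise coverage needs far fewer tests
    # than the 8 exhaustive combinations (4 here).
    for test in greedy_pairwise_suite([[0, 1], [0, 1], [0, 1]]):
        print(test)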

    Contextual equipoise: a novel concept to inform ethical implications for implementation research in low-income and middle-income countries

    The call for universal health coverage requires the urgent implementation and scale-up of interventions that are known to be effective in resource-poor settings. Achieving this objective requires high-quality implementation research (IR) that evaluates the complex phenomenon of the influence of context on the ability to effectively deliver evidence-based practice. Nevertheless, IR for global health is failing to apply a robust, theoretically driven approach, leading to ethical concerns associated with research that is not methodologically sound. Inappropriate methods are often used in IR to address and report on context. This may result in a lack of understanding of how to effectively adapt the intervention to the new setting and a lack of clarity in conceptualising whether there is sufficient evidence to generalise findings from previous IR to a new setting, or if a randomised controlled trial (RCT) is needed. Some of the ethical issues arising from this shortcoming include poor-quality research that may needlessly expose vulnerable participants to research that has not been adapted to suit local needs and priorities, and the inappropriate use of RCTs that denies participants in the control arm access to treatment that is effective within the local context. To address these concerns, we propose a complementary approach to clinical equipoise for IR, known as contextual equipoise. We discuss challenges in the evaluation of context and also with assessing the certainty of evidence to justify an RCT. Finally, we describe methods that can be applied to improve the evaluation and reporting of context and to help understand if contextual equipoise can be justified or if significant adaptations are required. We hope our analysis offers helpful insight to better understand and ensure that the ethical principle of beneficence is upheld in the real-world contexts of IR in low-resource settings.